Pythia-1B Deduplicated is a 1-billion-parameter transformer language model developed by EleutherAI for interpretability research, trained on the deduplicated version of the Pile dataset.
Tags: large language model, Transformers, English
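
A minimal sketch of loading the model with the Hugging Face `transformers` library and generating a short completion, assuming the checkpoint is published on the Hub under the identifier `EleutherAI/pythia-1b-deduped`:

```python
# Minimal sketch: load Pythia-1B Deduplicated and generate text.
# Assumes the checkpoint is available as "EleutherAI/pythia-1b-deduped".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-1b-deduped"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "The Pile is a dataset that"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding of a short continuation.
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the Pythia suite is aimed at interpretability research, intermediate training checkpoints are typically exposed as well; if so, a specific training step can usually be selected by passing a `revision` argument to `from_pretrained`.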